Learning Summary Statistic for Approximate Bayesian Computation via Deep Neural Network

Authors

  • Bai Jiang
  • Tung-Yu Wu
  • Charles Zheng
  • Wing H. Wong
Abstract

Approximate Bayesian Computation (ABC) methods are used to approximate posterior distributions in models with unknown or computationally intractable likelihoods. Both the accuracy and computational efficiency of ABC depend on the choice of summary statistic, but outside of special cases where the optimal summary statistics are known, it is unclear which guiding principles can be used to construct effective summary statistics. In this paper we explore the possibility of automating the process of constructing summary statistics by training deep neural networks to predict the parameters from artificially generated data: the resulting summary statistics are approximately posterior means of the parameters. With minimal model-specific tuning, our method constructs summary statistics for the Ising model and the moving-average model, which match or exceed theoretically motivated summary statistics in terms of the accuracy of the resulting posteriors.
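The pipeline the abstract describes — simulate (θ, x) pairs from the prior and the model, train a regressor to predict θ from x, then use the regressor's output as the summary statistic in rejection ABC — can be sketched on a toy Gaussian-mean model. The sketch below substitutes ordinary least squares for the deep neural network; the simulator, the tolerance 0.1, and all names are illustrative assumptions, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate(theta, n=20):
    """Toy simulator: a dataset x ~ N(theta, 1) of length n."""
    return rng.normal(theta, 1.0, size=n)

# 1) Generate training pairs (theta_i, x_i) from the prior and the simulator.
prior = lambda size: rng.uniform(-3, 3, size=size)
thetas = prior(5000)
X = np.stack([simulate(t) for t in thetas])

# 2) Fit a predictor of theta from x. The paper trains a deep network here;
#    least squares is a stand-in. Its output approximates E[theta | x],
#    which is what the learned summary statistic targets.
A = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(A, thetas, rcond=None)
summary = lambda x: np.append(x, 1.0) @ w

# 3) Rejection ABC: accept candidate parameters whose simulated summary
#    falls within a tolerance of the observed summary.
theta_true = 1.5
x_obs = simulate(theta_true)
s_obs = summary(x_obs)

cand = prior(20000)
sims = np.array([summary(simulate(t)) for t in cand])
accepted = cand[np.abs(sims - s_obs) < 0.1]

posterior_mean = accepted.mean()
```

Because the learned summary approximates the posterior mean, accepted parameters concentrate near the true value; with a deep network in step 2, the same three-step loop applies to models like Ising or moving-average where no analytic summary is available.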


Similar Articles

Deep Generative Vision as Approximate Bayesian Computation

Probabilistic formulations of inverse graphics have recently been proposed for a variety of 2D and 3D vision problems [15, 12, 14, 9]. These approaches represent visual elements in form of graphics simulators that produce approximate renderings of the visual scenes. Existing approaches either model pixel data or hand-crafted intermediate representations such as edge maps, super-pixels, silhouet...


Learning Summary Statistics for Approximate Bayesian Computation

In high dimensional data, it is often very difficult to analytically evaluate the likelihood function, and thus hard to get a Bayesian posterior estimation. Approximate Bayesian Computation is an important algorithm in this application. However, to apply the algorithm, we need to compress the data into low dimensional summary statistics, which is typically hard to get in an analytical form. In ...


Learning from LDA Using Deep Neural Networks

Latent Dirichlet Allocation (LDA) is a three-level hierarchical Bayesian model for topic inference. In spite of its great success, inferring the latent topic distribution with LDA is time-consuming. Motivated by the transfer learning approach proposed by Hinton et al. (2015), we present a novel method that uses LDA to supervise the training of a deep neural network (DNN), so that the DNN can ap...


K2-ABC: Approximate Bayesian Computation with Infinite Dimensional Summary Statistics via Kernel Embeddings

Complicated generative models often result in a situation where computing the likelihood of observed data is intractable, while simulating from the conditional density given a parameter value is relatively easy. Approximate Bayesian Computation (ABC) is a paradigm that enables simulation-based posterior inference in such cases by measuring the similarity between simulated and observed data in t...


Bayesian Hypernetworks

We propose Bayesian hypernetworks: a framework for approximate Bayesian inference in neural networks. A Bayesian hypernetwork, h, is a neural network which learns to transform a simple noise distribution, p(ε) = N(0, I), to a distribution q(θ) := q(h(ε)) over the parameters θ of another neural network (the "primary network"). We train q with variational inference, using an invertible h to ena...



Journal:

Volume   Issue

Pages  -

Publication date: 2017